Learning Hand-Eye Coordination for Robotic Grasping with Deep Learning and Large-Scale Data Collection

Authors

  • Sergey Levine
  • Peter Pastor
  • Alex Krizhevsky
  • Deirdre Quillen
Abstract

We describe a learning-based approach to hand-eye coordination for robotic grasping from monocular images. To learn hand-eye coordination for grasping, we trained a large convolutional neural network to predict the probability that task-space motion of the gripper will result in successful grasps, using only monocular camera images and independently of camera calibration or the current robot pose. This requires the network to observe the spatial relationship between the gripper and objects in the scene, thus learning hand-eye coordination. We then use this network to servo the gripper in real time to achieve successful grasps. To train our network, we collected over 800,000 grasp attempts over the course of two months, using between 6 and 14 robotic manipulators at any given time, with differences in camera placement and hardware. Our experimental evaluation demonstrates that our method achieves effective real-time control, can successfully grasp novel objects, and corrects mistakes by continuous servoing.
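The servoing idea in the abstract — score candidate task-space motions with a learned grasp-success predictor and execute the best one — can be sketched as a small sampling-based optimization loop. The full paper uses the cross-entropy method (CEM) for this search; the scoring function below is a toy stand-in for the actual CNN, and all names and dimensions here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def predict_success(image, motion_command):
    """Stand-in for the grasp-prediction CNN.

    The real network maps a monocular image plus a candidate task-space
    gripper motion to a success probability; here a toy distance-based
    score makes the servoing loop below runnable."""
    target = np.array([0.5, 0.0, 0.2])  # hypothetical object location
    return float(np.exp(-np.linalg.norm(motion_command - target)))

def servo_step(image, n_samples=64, n_elite=6, n_iters=3):
    """One continuous-servoing step: cross-entropy-method search over
    3-D task-space motions, scored by the success predictor.

    Each iteration samples motions from a Gaussian, keeps the top-scoring
    'elite' fraction, and refits the Gaussian to them."""
    mean, std = np.zeros(3), np.ones(3)
    for _ in range(n_iters):
        samples = np.random.randn(n_samples, 3) * std + mean
        scores = np.array([predict_success(image, s) for s in samples])
        elite = samples[np.argsort(scores)[-n_elite:]]
        mean = elite.mean(axis=0)
        std = elite.std(axis=0) + 1e-6  # avoid premature collapse
    return mean  # motion command to execute next
```

Because the predictor is re-queried after every executed motion, the controller can correct mistakes on the fly — the continuous servoing behavior the abstract describes.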


Related Papers

Learning Hand-Eye Coordination for Robotic Grasping with Large-Scale Data Collection

We describe a learning-based approach to hand-eye coordination for robotic grasping from monocular images. To learn hand-eye coordination for grasping, we trained a large convolutional neural network to predict the probability that task-space motion of the gripper will result in successful grasps, using only monocular camera images and independently of camera calibration or the current robot po...


An Integrated Simulator and Dataset that Combines Grasping and Vision for Deep Learning

Deep learning is an established framework for learning hierarchical data representations. While compute power is in abundance, one of the main challenges in applying this framework to robotic grasping has been obtaining the amount of data needed to learn these representations, and structuring the data to the task at hand. Among contemporary approaches in the literature, we highlight key propert...


The Effect of Pairwise Video Feedback on the Learning of Elegant Eye-Hand Coordination Skill

The present paper aimed to study the effect of pairwise video check feedback (including the observation of an external pattern of skill performance while performing the skill simultaneously) on the acquisition and learning of eye-hand coordination skill. A computer-based eye-hand coordination task was the tool used in this study. 24 subjects were randomly selected and equally divided...


Learning Internal Models for Eye-Hand Coordination in Reaching and Grasping

A computational model of sensorimotor transformations underlying eye-hand coordination in reaching and grasping is presented and tested in a robot-arm setup. We suggest solutions for both the problem of the missing teacher signal and for the problem of one-to-many mappings encountered when learning inverse models and controllers.


Deep Reinforcement Learning for Vision-Based Robotic Grasping: A Simulated Comparative Evaluation of Off-Policy Methods

In this paper, we explore deep reinforcement learning algorithms for vision-based robotic grasping. Model-free deep reinforcement learning (RL) has been successfully applied to a range of challenging environments, but the proliferation of algorithms makes it difficult to discern which particular approach would be best suited for a rich, diverse task like grasping. To answer this question, we pro...




Journal:
  • CoRR

Volume: abs/1603.02199  Issue: 

Pages: -

Publication date: 2016